
    A time-series method to identify and correct range sidelobes in meteorological radar data

    The use of pulse compression techniques to improve the sensitivity of meteorological radars has become increasingly common in recent years. An unavoidable side-effect of such techniques is the formation of ‘range sidelobes’, which lead to spreading of information across several range gates. These artefacts are particularly troublesome in regions where there is a sharp gradient in the power backscattered to the antenna as a function of range. In this article we present a simple method for identifying and correcting range sidelobe artefacts. We make use of the fact that meteorological targets produce an echo which fluctuates at random, and that this echo, like a fingerprint, is unique to each range gate. By cross-correlating the echo time series from pairs of gates, we can therefore identify whether information from one gate has spread into another, and hence flag regions of contamination. In addition, we show that the correlation coefficients contain quantitative information about the fraction of power leaked from one range gate to another, and we propose a simple algorithm to correct the corrupted reflectivity profile.
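The gate-pairing idea can be sketched in a few lines: normalise each gate's echo time series, compute the zero-lag cross-correlation between every pair of gates, and flag pairs whose correlation exceeds a threshold. This is an illustrative sketch only; the function name, the zero-lag statistic, and the 0.3 threshold are assumptions, not the paper's exact algorithm.

```python
import numpy as np

def flag_sidelobe_contamination(echoes, rho_threshold=0.3):
    """Flag range gates whose echo time series correlate strongly with
    another gate, indicating possible range-sidelobe leakage.

    echoes : complex array, shape (n_gates, n_pulses) - the echo time
             series at each range gate.
    Returns (mask, rho): a boolean per-gate flag and the matrix of
    pairwise correlation magnitudes.
    """
    n_pulses = echoes.shape[1]
    # Normalise each gate's series to zero mean and unit mean power,
    # so the pairwise statistic is a correlation coefficient.
    x = echoes - echoes.mean(axis=1, keepdims=True)
    x = x / np.sqrt((np.abs(x) ** 2).mean(axis=1, keepdims=True))
    # Magnitude of the zero-lag cross-correlation for every gate pair.
    rho = np.abs(x @ x.conj().T) / n_pulses
    np.fill_diagonal(rho, 0.0)  # ignore each gate's self-correlation
    # Both members of a leaking pair are flagged; independent gates
    # decorrelate as ~1/sqrt(n_pulses) and fall below the threshold.
    mask = (rho > rho_threshold).any(axis=1)
    return mask, rho
```

Because meteorological echoes fluctuate randomly, two uncontaminated gates decorrelate quickly as the number of pulses grows, so a modest threshold cleanly separates leakage from chance correlation.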

    The potential of 1 h refractivity changes from an operational C-band magnetron-based radar for numerical weather prediction validation and data assimilation

    Refractivity changes (ΔN) derived from radar ground clutter returns serve as a proxy for near-surface humidity changes (1 N unit ≡ 1% relative humidity at 20 °C). Previous studies have indicated that better humidity observations should improve forecasts of convection initiation. A preliminary assessment of the potential of refractivity retrievals from an operational magnetron-based C-band radar is presented. The increased phase noise at shorter wavelengths, exacerbated by the unknown position of the target within the 300 m gate, makes it difficult to obtain absolute refractivity values, so we consider the information in 1 h changes. These have been derived to a range of 30 km with a spatial resolution of ∼4 km; the consistency of the individual estimates (within each 4 km × 4 km area) indicates that ΔN errors are about 1 N unit, in agreement with in situ observations. Measurements from an instrumented tower on summer days show that the 1 h refractivity changes up to a height of 100 m remain well correlated with near-surface values. The analysis of refractivity as represented in the operational Met Office Unified Model at 1.5, 4 and 12 km grid lengths demonstrates that, as model resolution increases, the spatial scales of the refractivity structures improve. It is shown that the magnitude of refractivity changes is progressively underestimated at larger grid lengths during summer. However, the daily time series of 1 h refractivity changes reveal that, whereas the radar-derived values are very well correlated with the in situ observations, the high-resolution model runs have little skill in getting the right values of ΔN in the right place at the right time. This suggests that the assimilation of these radar refractivity observations could benefit forecasts of the initiation of convection.
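For context, clutter-based retrievals infer ΔN from the change in the two-way phase of returns from stationary ground targets. The standard relation in the radar-refractivity literature is Δφ = (4πfr/c) · 10⁻⁶ · ΔN; the abstract does not detail the paper's own processing chain, so the function below is an illustrative assumption built on that textbook relation only.

```python
import numpy as np

C = 2.998e8  # speed of light (m/s)

def delta_n_from_phase(delta_phi_rad, freq_hz, target_range_m):
    """Path-averaged refractivity change (N units) implied by a change
    in the two-way phase of a stationary ground-clutter target,
    inverting delta_phi = (4*pi*f*r/c) * 1e-6 * delta_N.
    """
    return delta_phi_rad * C / (4.0 * np.pi * freq_hz * target_range_m * 1.0e-6)
```

At C band (~5.6 GHz) and 10 km range, a 1 N unit change (roughly a 1% relative-humidity change at 20 °C) already corresponds to a phase change above 2 rad, which illustrates why phase noise and target-position uncertainty make absolute retrievals difficult and why 1 h changes are used instead.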

    Tumor necrosis factor enhances the capsaicin sensitivity of rat sensory neurons

    The capacity of the proinflammatory cytokines, tumor necrosis factor alpha (TNF alpha) and interleukin 1 beta (IL-1 beta), to modulate the sensitivity of isolated sensory neurons grown in culture to the excitatory chemical agent capsaicin was examined. Alterations in capsaicin sensitivity were assessed by quantifying the number of neurons labeled with cobalt after exposure to capsaicin and by recording the whole-cell response from a single neuron to the focal application of capsaicin. A 24 hr pretreatment of the neuronal cultures with TNF alpha (10 or 50 ng/ml), but not IL-1 beta (10 or 50 ng/ml), produced a concentration-dependent increase in the number of cobalt-labeled neurons after exposure to 100 nM capsaicin. The peak increase in the number of labeled neurons was attained after a 4 hr treatment with 10 ng/ml TNF alpha. Similarly, pretreatment with TNF alpha (10 ng/ml for 4, 12, and 24 hr) produced a greater than twofold increase in the average peak amplitude of the inward current evoked by 100 nM capsaicin. Both the TNF alpha-induced increase in labeling and current amplitude were blocked by treating the neuronal cultures with indomethacin before the addition of TNF alpha. Enhancement of the capsaicin-evoked current also was blocked by the specific cyclo-oxygenase-2 inhibitor SC-236. These results indicate that TNF alpha can enhance the sensitivity of sensory neurons to the excitation produced by capsaicin and that this enhancement likely is mediated by the neuronal production of prostaglandins. Isolated sensory neurons grown in culture may prove to be a useful model system in which to explore how prolonged exposure to mediators associated with chronic inflammation alters the regulatory pathways that modulate the excitability of the nervous system.

    Complementary role of cardiac CT in the assessment of aortic valve replacement dysfunction

    Aortic valve replacement is the second most common cardiothoracic procedure in the UK. With an ageing population, there are an increasing number of patients with prosthetic valves that require follow-up. Imaging of prosthetic valves is challenging with conventional echocardiographic techniques, making early detection of valve dysfunction or complications difficult. CT has recently emerged as a complementary approach offering excellent spatial resolution and the ability to identify a range of aortic valve replacement complications, including structural valve dysfunction, thrombus development, pannus formation and prosthetic valve infective endocarditis. This review discusses each of these and how CT might be incorporated into a multimodal cardiovascular imaging pathway for the assessment of aortic valve replacements and in guiding clinical management.

    Precision scans of the pixel cell response of double sided 3D pixel detectors to pion and x-ray beams

    Three-dimensional (3D) silicon sensors offer potential advantages over standard planar sensors for radiation hardness in future high energy physics experiments and reduced charge-sharing for X-ray applications, but may introduce inefficiencies due to the columnar electrodes. These inefficiencies are probed by studying variations in response across a unit pixel cell in a 55 μm pitch double-sided 3D pixel sensor bump bonded to TimePix and Medipix2 readout ASICs. Two complementary characterisation techniques are discussed: the first uses a custom built telescope and a 120 GeV pion beam from the Super Proton Synchrotron (SPS) at CERN; the second employs a novel technique to illuminate the sensor with a micro-focused synchrotron X-ray beam at the Diamond Light Source, UK. For a pion beam incident perpendicular to the sensor plane an overall pixel efficiency of 93.0±0.5% is measured. After a 10° rotation of the device the effect of the columnar region becomes negligible and the overall efficiency rises to 99.8±0.5%. The double-sided 3D sensor shows significantly reduced charge sharing to neighbouring pixels compared to the planar device. The charge sharing results obtained from the X-ray beam study of the 3D sensor are shown to agree with a simple simulation in which charge diffusion is neglected. The devices tested are found to be compatible with having a region in which no charge is collected, centred on the electrode columns and of radius 7.6±0.6 μm. Charge collection above and below the columnar electrodes in the double-sided 3D sensor is observed.

    The DYMECS project: a statistical approach for the evaluation of convective storms in high-resolution NWP models

    A new frontier in weather forecasting is emerging as operational forecast models are now being run at convection-permitting resolutions at many national weather services. However, this is not a panacea; significant systematic errors remain in the character of convective storms and rainfall distributions. The DYMECS project (Dynamical and Microphysical Evolution of Convective Storms) is taking a fundamentally new approach to evaluate and improve such models: rather than relying on a limited number of cases, which may not be representative, we have gathered a large database of 3D storm structures on 40 convective days using the Chilbolton radar in southern England. We have related these structures to storm life-cycles derived by tracking features in the rainfall from the UK radar network, and compared them statistically to storm structures in the Met Office model, which we ran at horizontal grid lengths between 1.5 km and 100 m, including simulations with different subgrid mixing lengths. We also evaluated the scale and intensity of convective updrafts using a new radar technique. We find that the horizontal sizes of simulated convective storms and of the updrafts within them are much too large at 1.5-km resolution, such that the convective mass flux of individual updrafts can be too large by an order of magnitude. The scale of precipitation cores and updrafts decreases steadily with decreasing grid length, as does the typical storm lifetime. The 200-m grid-length simulation with standard mixing length performs best over all diagnostics, although a greater mixing length improves the representation of deep convective storms.

    Deep-Learning for Epicardial Adipose Tissue Assessment with Computed Tomography: Implications for Cardiovascular Risk Prediction

    Background: Epicardial adipose tissue (EAT) volume is a marker of visceral obesity that can be measured in coronary computed tomography angiograms (CCTA). The clinical value of integrating this measurement in routine CCTA interpretation has not been documented. Objectives: This study sought to develop a deep-learning network for automated quantification of EAT volume from CCTA, test it in patients who are technically challenging, and validate its prognostic value in routine clinical care. Methods: The deep-learning network was trained and validated to autosegment EAT volume in 3,720 CCTA scans from the ORFAN (Oxford Risk Factors and Noninvasive Imaging Study) cohort. The model was tested in patients with challenging anatomy and scan artifacts and applied to a longitudinal cohort of 253 patients post-cardiac surgery and 1,558 patients from the SCOT-HEART (Scottish Computed Tomography of the Heart) Trial, to investigate its prognostic value. Results: External validation of the deep-learning network yielded a concordance correlation coefficient of 0.970 for machine vs human. EAT volume was associated with coronary artery disease (odds ratio [OR] per SD increase in EAT volume: 1.13 [95% CI: 1.04-1.30]; P = 0.01) and atrial fibrillation (OR: 1.25 [95% CI: 1.08-1.40]; P = 0.03), after correction for risk factors (including body mass index). EAT volume predicted all-cause mortality (HR per SD: 1.28 [95% CI: 1.10-1.37]; P = 0.02), myocardial infarction (HR: 1.26 [95% CI: 1.09-1.38]; P = 0.001), and stroke (HR: 1.20 [95% CI: 1.09-1.38]; P = 0.02) independently of risk factors in SCOT-HEART (5-year follow-up). It also predicted in-hospital (HR: 2.67 [95% CI: 1.26-3.73]; P ≤ 0.01) and long-term post–cardiac surgery atrial fibrillation (7-year follow-up; HR: 2.14 [95% CI: 1.19-2.97]; P ≤ 0.01).
Conclusions: Automated assessment of EAT volume is possible in CCTA, including in patients who are technically challenging; it forms a powerful marker of metabolically unhealthy visceral obesity, which could be used for cardiovascular risk stratification.

    Impact of Xpert MTB/RIF for TB diagnosis in a primary care clinic with high TB and HIV prevalence in South Africa: a pragmatic randomised trial

    Background: Xpert MTB/RIF is approved for use in tuberculosis (TB) and rifampicin-resistance diagnosis. However, data are limited on the impact of Xpert under routine conditions in settings with high TB burden. Methods and Findings: A pragmatic prospective cluster-randomised trial of Xpert for all individuals with presumptive (symptomatic) TB compared to the routine diagnostic algorithm of sputum microscopy and limited use of culture was conducted in a large TB/HIV primary care clinic. The primary outcome was the proportion of bacteriologically confirmed TB cases not initiating TB treatment by 3 mo after presentation. Secondary outcomes included time to TB treatment and mortality. Unblinded randomisation occurred on a weekly basis. Xpert and smear microscopy were performed on site. Analysis was both by intention to treat (ITT) and per protocol. Between 7 September 2010 and 28 October 2011, 1,985 participants were assigned to the Xpert (n = 982) and routine (n = 1,003) diagnostic algorithms (ITT analysis); 882 received Xpert and 1,063 routine (per protocol analysis). 13% (32/257) of individuals with bacteriologically confirmed TB (smear, culture, or Xpert) did not initiate treatment by 3 mo after presentation in the Xpert arm, compared to 25% (41/167) in the routine arm (ITT analysis, risk ratio 0.51, 95% CI 0.33–0.77, p = 0.0052). The yield of bacteriologically confirmed TB cases among patients with presumptive TB was 17% (167/1,003) with routine diagnosis and 26% (257/982) with Xpert diagnosis (ITT analysis, risk ratio 1.57, 95% CI 1.32–1.87, p<0.001). This difference in diagnosis rates resulted in a higher rate of treatment initiation in the Xpert arm: 23% (229/1,003) and 28% (277/982) in the routine and Xpert arms, respectively (ITT analysis, risk ratio 1.24, 95% CI 1.06–1.44, p = 0.013). 
Time to treatment initiation was improved overall (ITT analysis, hazard ratio 0.76, 95% CI 0.63–0.92, p = 0.005) and among HIV-infected participants (ITT analysis, hazard ratio 0.67, 95% CI 0.53–0.85, p = 0.001). There was no difference in 6-mo mortality with Xpert versus routine diagnosis. Study limitations included incorrect intervention allocation for a high proportion of participants and that the study was conducted in a single clinic. Conclusions: These data suggest that in this routine primary care setting, use of Xpert to diagnose TB increased the number of individuals with bacteriologically confirmed TB who were treated by 3 mo and reduced time to treatment initiation, particularly among HIV-infected participants.

    NEB mutations disrupt the super-relaxed state of myosin and remodel the muscle metabolic proteome in nemaline myopathy

    Nemaline myopathy (NM) is one of the most common non-dystrophic genetic muscle disorders. NM is often associated with mutations in the NEB gene. Even though the exact NEB-NM pathophysiological mechanisms remain unclear, histological analyses of patients' muscle biopsies often reveal unexplained accumulation of glycogen and abnormally shaped mitochondria. Hence, the aim of the present study was to define the exact molecular and cellular cascade of events that would lead to potential changes in muscle energetics in NEB-NM. For that, we applied a wide range of biophysical and cell biology assays on skeletal muscle fibres from NM patients as well as untargeted proteomics analyses on isolated myofibres from a muscle-specific nebulin-deficient mouse model. Unexpectedly, we found that the myosin stabilizing conformational state, known as the super-relaxed state, was significantly impaired, inducing an increase in the energy (ATP) consumption of resting muscle fibres from NEB-NM patients when compared with controls or with other forms of genetic/rare, acquired NM. This destabilization of the myosin super-relaxed state had dynamic consequences, as we observed a remodeling of the metabolic proteome in muscle fibres from nebulin-deficient mice. Altogether, our findings explain some of the hitherto obscure hallmarks of NM, including the appearance of abnormal energy proteins, and suggest potential beneficial effects of drugs targeting myosin activity/conformations for NEB-NM.